An analysis of the exponentiated gradient descent algorithm
Authors
Abstract
This paper analyses three algorithms recently studied in the computational learning theory community: the Gradient Descent (GD) algorithm, the Exponentiated Gradient algorithm with positive and negative weights (EG algorithm), and the Exponentiated Gradient algorithm with unnormalised positive and negative weights (EGU algorithm). The analysis is of the form used in the signal processing community and is in terms of the mean squared error (MSE). A relationship between the learning rate and the MSE of the predictions is found for this family of algorithms. Trials involving simulated acoustic echo cancellation are conducted in which the learning rates are selected so that the algorithms converge to the same steady-state MSE. These trials demonstrate that, when the target is sparse, the EG algorithm typically converges more quickly than the GD and EGU algorithms, which perform very similarly.
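For concreteness, the sketch below gives the per-sample form of the three updates for a linear filter with prediction y_hat = w @ x under squared loss, in the style of Kivinen and Warmuth. This is a minimal sketch, not the exact recursions analysed in the paper: the function names are illustrative, the factor 2*eta comes from the squared-loss gradient, and U is the EG algorithm's total-weight scaling parameter, conventions that vary between papers.

```python
import numpy as np

def gd_step(w, x, y, eta):
    """Gradient Descent (LMS-style) update for y_hat = w @ x, squared loss."""
    err = y - w @ x
    return w + 2 * eta * err * x

def egu_step(wp, wn, x, y, eta):
    """EGU update: unnormalised exponentiated gradient with positive and
    negative weights; the effective filter is wp - wn."""
    err = y - (wp - wn) @ x
    r = np.exp(2 * eta * err * x)   # componentwise exp(-eta * gradient)
    return wp * r, wn / r

def eg_step(wp, wn, x, y, eta, U=1.0):
    """EG update: as EGU, but wp and wn are rescaled so their total mass
    stays at U, which bounds the L1 norm of the effective filter by U."""
    wp, wn = egu_step(wp, wn, x, y, eta)
    scale = U / (wp.sum() + wn.sum())
    return wp * scale, wn * scale
```

A common initialisation for the EG variants is wp = wn = U / (2 * n) * np.ones(n) for an n-tap filter, which starts the effective filter wp - wn at zero.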
Similar Papers
A Note on the Descent Property Theorem for the Hybrid Conjugate Gradient Algorithm CCOMB Proposed by Andrei
In [1] (Hybrid Conjugate Gradient Algorithm for Unconstrained Optimization, J. Optim. Theory Appl. 141 (2009) 249-264), an efficient hybrid conjugate gradient algorithm, the CCOMB algorithm, is proposed for solving unconstrained optimization problems. However, the proof of Theorem 2.1 in [1] is incorrect due to an erroneous inequality used to indicate the descent property for the s...
Exponentiated Gradient LINUCB for Contextual Multi-Armed Bandits
We present Exponentiated Gradient LINUCB, an algorithm for contextual multi-armed bandits. The algorithm uses Exponentiated Gradient to find the optimal exploration rate for LINUCB. Within a deliberately designed offline simulation framework, we conduct evaluations with real online event log data. The experimental results demonstrate that our algorithm outperforms the surveyed algorithms.
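The abstract does not spell out the mechanism, but one natural reading is that a pool of candidate exploration levels for LinUCB is reweighted multiplicatively according to observed reward. The class below is a hypothetical sketch along those lines; the names EGExploration, sample_alpha, and update are inventions for illustration, not the paper's API.

```python
import numpy as np

def linucb_score(theta, A_inv, x, alpha):
    """Standard LinUCB upper-confidence score for a context vector x."""
    return theta @ x + alpha * np.sqrt(x @ A_inv @ x)

class EGExploration:
    """Hypothetical sketch: exponentiated-gradient (multiplicative) weighting
    over candidate LinUCB exploration levels alpha."""

    def __init__(self, alphas, eta=0.1):
        self.alphas = np.asarray(alphas)
        self.w = np.full(len(alphas), 1.0 / len(alphas))
        self.eta = eta
        self.last = 0

    def sample_alpha(self, rng):
        # Draw an exploration level in proportion to its current weight.
        self.last = rng.choice(len(self.alphas), p=self.w)
        return self.alphas[self.last]

    def update(self, reward):
        # Multiplicative (exponentiated-gradient) update, then renormalise.
        self.w[self.last] *= np.exp(self.eta * reward)
        self.w /= self.w.sum()
```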
An eigenvalue study on the sufficient descent property of a modified Polak-Ribière-Polyak conjugate gradient method
Based on an eigenvalue analysis, a new proof for the sufficient descent property of the modified Polak-Ribière-Polyak conjugate gradient method proposed by Yu et al. is presented.
Convergence of exponentiated gradient algorithms
This paper studies three related algorithms: the (traditional) Gradient Descent (GD) Algorithm, the Exponentiated Gradient Algorithm with Positive and Negative weights (EG algorithm) and the Exponentiated Gradient Algorithm with Unnormalized Positive and Negative weights (EGU algorithm). These algorithms have been previously analyzed using the “mistake-bound framework” in the computational lear...
(Exponentiated) Stochastic Gradient Descent for L1 Constrained Problems
This note is by Sham Kakade, Dean Foster, and Eyal Even-Dar. It is intended as an introductory piece on solving L1 constrained problems with online methods. Convex optimization problems with L1 constraints frequently underlie tasks such as feature selection and obtaining sparse representations. This note shows that the exponentiated gradient algorithm (of Kivinen and Warmuth (19...
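The connection the note exploits can be sketched in a few lines: with the positive/negative weight-doubling trick, the normalised EG update keeps sum(wp) + sum(wn) = U, which automatically enforces ||w||_1 <= U for w = wp - wn. Below is a minimal sketch of one such step for an arbitrary convex loss supplied through its gradient; eg_l1_step is an illustrative name, not taken from the note.

```python
import numpy as np

def eg_l1_step(wp, wn, grad, eta, U=1.0):
    """One exponentiated-gradient step under the constraint ||w||_1 <= U.
    `grad` is the loss gradient evaluated at w = wp - wn; renormalising the
    total mass of wp and wn to U keeps the iterate inside the L1 ball."""
    wp = wp * np.exp(-eta * grad)
    wn = wn * np.exp(eta * grad)
    scale = U / (wp.sum() + wn.sum())
    return wp * scale, wn * scale

# Starting from wp = wn = U / (2 * d) for dimension d gives w = 0 initially.
```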